Matrix Decompositions

Understanding eigenvalue decompositions and SVD.
Author

Vannsh Jani

Published

August 23, 2023

1. Eigenvalue Decomposition

Matrix decompositions are needed to solve large systems of linear equations or to diagonalize a matrix on a computer. The eigenvalue decomposition of a square matrix factorizes it into a product of three matrices built from its eigenvectors and eigenvalues.

Let’s say we take an N×N matrix A that we want to decompose.
We consider a matrix U whose column vectors are the eigenvectors of A.
Then we can say
\[ U=\begin{bmatrix} u_1 & u_2 & \cdots & u_n \end{bmatrix} \]
where \(u_1,u_2\cdots u_n\) are eigenvectors of A.
If we post-multiply A with U we get,
\[ \begin{equation} \begin{split} AU &= \begin{bmatrix} Au_1 & Au_2 & \cdots & Au_n \end{bmatrix} \\ &= \begin{bmatrix} \lambda_1 u_1 & \lambda_2 u_2 & \cdots & \lambda_n u_n \end{bmatrix} \end{split} \end{equation} \]
since \(u_i\) is an eigenvector of A, \(Au_i=\lambda_iu_i \quad \forall \; i \in \{1,\dots,n\}\), where \(\lambda_i\) is the corresponding eigenvalue.
Next, we can split the RHS into U and a diagonal matrix \(\Lambda\) containing the eigenvalues of A. \[\begin{equation} \begin{split} AU &=\begin{bmatrix} u_1 & u_2 & \cdots & u_n \end{bmatrix} \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix} \\ AU &= U\Lambda \end{split} \end{equation}\]
Now, the column vectors of U are linearly independent whenever A is diagonalizable (for example, when A has n distinct eigenvalues), so U is invertible and we can post-multiply both sides by \(U^{-1}\).
\[ A=U\Lambda U^{-1} \]
The eigenvalue decomposition of A is hence complete.
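As a quick sanity check, here is a minimal NumPy sketch that reconstructs A from its decomposition (assuming NumPy is available; a random Gaussian matrix is diagonalizable with probability 1):

```python
import numpy as np

# Minimal sketch: verify A = U Λ U^{-1} for a random (almost surely
# diagonalizable) matrix. np.linalg.eig returns the eigenvalues and a
# matrix whose columns are the corresponding eigenvectors.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))

eigenvalues, U = np.linalg.eig(A)   # columns of U are eigenvectors of A
Lambda = np.diag(eigenvalues)       # diagonal matrix of eigenvalues

A_reconstructed = U @ Lambda @ np.linalg.inv(U)
print(np.allclose(A, A_reconstructed))  # True, up to floating-point error
```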

There is a special case of the above scenario. If A is a symmetric matrix, then its eigenvalues are real and eigenvectors corresponding to distinct eigenvalues are orthogonal (i.e., \(u_i \cdot u_j=0\) for \(i \neq j\)). Further, if we scale the eigenvectors to be orthonormal (i.e., \(u_i \cdot u_i=1\) for all \(i \in \{1,\dots,n\}\)), then we can conclude that
\[ U^TU=I \]i.e., U is an orthogonal matrix, so \(U^{-1}=U^T\). Hence, inverting U only requires taking its transpose, which is essentially free compared to the \(O(n^3)\) cost of a general matrix inversion.
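A short sketch of the symmetric case, assuming NumPy’s `np.linalg.eigh` (which is specialized for symmetric matrices and returns an orthonormal set of eigenvectors):

```python
import numpy as np

# Sketch of the symmetric special case: the eigenvector matrix U is
# orthogonal, so U^T U = I and the inverse of U is just its transpose.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
S = (B + B.T) / 2                   # symmetrize to obtain a symmetric matrix

eigenvalues, U = np.linalg.eigh(S)  # real eigenvalues, orthonormal eigenvectors

print(np.allclose(U.T @ U, np.eye(4)))     # True: U^T U = I
print(np.allclose(np.linalg.inv(U), U.T))  # True: U^{-1} = U^T
```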

Moreover, we can generalize to higher powers of A,
\[ \begin{equation} \begin{split} A &= U\Lambda U^{-1} \\ A^2 &= U\Lambda^2 U^{-1} \\ \vdots \\ A^n &= U\Lambda^n U^{-1} \end{split} \end{equation} \]
If we have to compute higher powers of A, we can use the right-hand side of the above equation, which only requires a higher power of \(\Lambda\), a diagonal matrix. The higher power of a diagonal matrix is obtained by simply raising its diagonal elements to that power, and thus eigenvalue decomposition can reduce the computational cost.
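A sketch of this shortcut, again assuming NumPy: the k-th power of \(\Lambda\) is an elementwise power of its diagonal, rather than k−1 full matrix multiplications.

```python
import numpy as np

# Sketch: compute A^k via the decomposition and compare against
# repeated matrix multiplication (np.linalg.matrix_power).
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
k = 5

eigenvalues, U = np.linalg.eig(A)
Lambda_k = np.diag(eigenvalues ** k)   # elementwise power of the diagonal

A_k = U @ Lambda_k @ np.linalg.inv(U)
print(np.allclose(A_k, np.linalg.matrix_power(A, k)))  # True
```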

Let’s now apply eigenvalue decomposition to a matrix.